Mixtures of Experts Estimate A Posteriori Probabilities
Abstract
The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that for classification problems the minimization of this ME error function leads to ME outputs estimating the a posteriori probabilities of class membership of the input vector.
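For orientation, the ME formulation behind this claim can be sketched as follows (standard notation in the style of Jacobs et al., 1991; the symbols are assumptions here, not quoted from the paper). A gating network produces weights g_j(x) with \sum_j g_j(x) = 1, expert j produces an output y_j(x), and the model defines the conditional density

p(t \mid x) = \sum_{j=1}^{K} g_j(x)\, \phi_j(t \mid x),

with the ME error function being the negative log-likelihood over the training set,

E = -\sum_{n} \ln \sum_{j=1}^{K} g_j(x^{(n)})\, \phi_j(t^{(n)} \mid x^{(n)}).

With 1-of-c coded class targets t, the paper's result is that at a minimum of E the combined output \sum_j g_j(x)\, y_{jk}(x) estimates the posterior probability P(C_k \mid x).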
Similar resources
Mixtures of Experts Estimate a Posteriori Probabilities
The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that for classification problems the minimization of this ME error function leads to ME outputs estimating the...
Full text
ALGEBRAIC NONLINEARITY IN VOLTERRA-HAMMERSTEIN EQUATIONS
Here an a posteriori error estimate for the numerical solution of nonlinear Volterra-Hammerstein equations is given. We present an error upper bound for nonlinear Volterra-Hammerstein integral equations, in which the form of nonlinearity is algebraic, and develop an a posteriori error estimate for the recently proposed method of Brunner for these problems (the implicitly linear collocation method)...
Full text
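For context, Volterra-Hammerstein integral equations with algebraic nonlinearity have the general form (a textbook statement, not quoted from the paper):

y(t) = f(t) + \int_{0}^{t} K(t,s)\, [y(s)]^{p} \, ds, \qquad 0 \le t \le T,

where the power p \ge 2 is the algebraic nonlinearity. The "implicitly linear" idea in collocation methods of this kind is to work with the transformed unknown z(s) = [y(s)]^{p}, on which y depends linearly through the integral operator.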
A Self-organized Multi Agent Decision Making System Based on Fuzzy Probabilities: The Case of Aphasia Diagnosis
Aphasia diagnosis is a challenging medical diagnostic task due to linguistic uncertainty and vagueness, the large number of imprecise measurements, inconsistencies in the definition of aphasic syndromes, and natural diversity and subjectivity in test objects as well as in the opinions of the experts who diagnose the disease. In this paper we present a new self-organized multi-agent system that diagno...
Full text
Towards EM-style Algorithms for a posteriori Optimization of Normal Mixtures
Expectation maximization (EM) provides a simple and elegant approach to the problem of optimizing the parameters of a normal mixture on an unlabeled dataset. To accomplish this, EM iteratively reweights the elements of the dataset until a locally optimal normal mixture is obtained. This paper explores the intriguing question of whether such an EM-style algorithm exists for the related and appar...
Full text
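To make the "iteratively reweights" step concrete, here is a minimal EM sketch for a one-dimensional normal mixture (an illustration under common textbook assumptions; none of the names or defaults come from the paper):

import numpy as np

def em_normal_mixture(x, n_components=2, n_iters=100, seed=0):
    """EM for a 1-D Gaussian mixture: the E-step reweights each point
    by component responsibility, the M-step refits the parameters."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise mixing weights, means, and variances.
    pi = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, n_components, replace=False)
    var = np.full(n_components, np.var(x))
    for _ in range(n_iters):
        # E-step: responsibility r[i, j] = P(component j | x_i).
        dens = (np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
                / np.sqrt(2 * np.pi * var))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the reweighted data.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example: recover a mixture of N(0, 1) and N(5, 1).
x = np.concatenate([np.random.normal(0, 1, 500),
                    np.random.normal(5, 1, 500)])
print(em_normal_mixture(x))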
Neural Network Classifiers Estimate Bayesian a posteriori Probabilities
Many neural network classifiers provide outputs which estimate Bayesian a posteriori probabilities. When the estimation is accurate, network outputs can be treated as probabilities and sum to one. Simple proofs show that Bayesian probabilities are estimated when desired network outputs are 1 of M (one output unity, all others zero) and a squared-error or cross-entropy cost function is used. Resu...
Full text
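The squared-error argument mentioned in the last abstract reduces to a conditional-expectation fact (a standard derivation, with 1-of-M target coding assumed): for a binary target t_k with P(t_k = 1 \mid x) = P(C_k \mid x),

E\left[(y_k(x) - t_k)^2 \mid x\right] = \left(y_k(x) - E[t_k \mid x]\right)^2 + \mathrm{Var}(t_k \mid x),

so the minimizing output is y_k^{*}(x) = E[t_k \mid x] = P(C_k \mid x), i.e. the Bayesian a posteriori probability.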